Extending AdaBoost to Iteratively Vary Its Base Classifiers
Authors
Abstract
This paper introduces AdaBoost Dynamic, an extension of the AdaBoost.M1 algorithm by Freund and Schapire. In this extension we use different "weak" classifiers in subsequent iterations of the algorithm, instead of AdaBoost's fixed base classifier. The algorithm is tested on a variety of datasets from the UCI repository, and the results show that it performs as well as AdaBoost run with the best possible base learner for a given dataset. This result therefore relieves the machine learning analyst of having to decide which base classifier to use.
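The abstract gives no pseudocode, but the core idea can be sketched as an AdaBoost.M1 loop that cycles through a pool of base-learner types across iterations instead of refitting one fixed type. The sketch below is an illustrative assumption, not the authors' implementation: the two toy weak learners (`Stump`, `NearestMean`), the two-class {-1, +1} setting, and the round count are all made up for demonstration.

```python
import numpy as np

class Stump:
    """Weak learner 1: axis-aligned decision stump (threshold one feature)."""
    def fit(self, X, y, w):
        best = (np.inf, 0, 0.0, 1)  # (weighted error, feature, threshold, sign)
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = np.where(sign * (X[:, j] - thr) > 0, 1, -1)
                    err = w[pred != y].sum()
                    if err < best[0]:
                        best = (err, j, thr, sign)
        _, self.j, self.thr, self.sign = best
        return self

    def predict(self, X):
        return np.where(self.sign * (X[:, self.j] - self.thr) > 0, 1, -1)

class NearestMean:
    """Weak learner 2: weighted nearest-class-mean classifier."""
    def fit(self, X, y, w):
        self.mu_pos = np.average(X[y == 1], axis=0, weights=w[y == 1])
        self.mu_neg = np.average(X[y == -1], axis=0, weights=w[y == -1])
        return self

    def predict(self, X):
        d_pos = np.linalg.norm(X - self.mu_pos, axis=1)
        d_neg = np.linalg.norm(X - self.mu_neg, axis=1)
        return np.where(d_pos < d_neg, 1, -1)

def adaboost_dynamic(X, y, learner_factories, rounds=6):
    """AdaBoost.M1-style loop; the 'dynamic' twist is cycling through
    different base-learner types across iterations."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial example weights
    ensemble = []
    for t in range(rounds):
        # Vary the base classifier: pick a different learner type each round.
        h = learner_factories[t % len(learner_factories)]().fit(X, y, w)
        pred = h.predict(X)
        eps = max(w[pred != y].sum(), 1e-10)  # clip to avoid log(0)
        if eps >= 0.5:                        # M1 stopping condition
            break
        beta = eps / (1.0 - eps)
        ensemble.append((h, np.log(1.0 / beta)))  # vote weight log(1/beta)
        w = w * np.where(pred == y, beta, 1.0)    # shrink correct examples
        w /= w.sum()                              # renormalize
    return ensemble

def predict(ensemble, X):
    score = sum(alpha * h.predict(X) for h, alpha in ensemble)
    return np.where(score >= 0, 1, -1)
```

On a toy two-cluster problem, e.g. `adaboost_dynamic(X, y, [Stump, NearestMean])`, each round fits whichever learner type is next in the pool, so the ensemble mixes classifier families rather than committing to one up front.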
Similar Papers
Boosting of Fuzzy Rules with Low Quality Data
An extension of the AdaBoost algorithm is proposed for obtaining fuzzy rule-based classifiers from imprecisely perceived data. Isolated fuzzy rules are regarded as weak learners, and knowledge bases as ensembles. Rules are iteratively added to a base, and the search for the best rule at each iteration is carried out by a genetic algorithm driven by a fuzzy fitness function. The successive weight...
Leveraging for Regression
In this paper we examine master regression algorithms that leverage base regressors by iteratively calling them on modified samples. The most successful leveraging algorithm for classification is AdaBoost, an algorithm that requires only modest assumptions on the base learning method for its good theoretical bounds. We present three gradient descent leveraging algorithms for regression and prov...
A Regularized Version of Adaboost for Pattern Classification in Historic Air Photographs
In this work, we present a novel classification method for geoinformatics tasks, based on a regularized version of the AdaBoost algorithm implemented in GRASS GIS. AdaBoost is a machine learning classification technique based on a weighted combination of different realizations of the same base model. AdaBoost calls a given base learning algorithm iteratively in a series of runs: at each run, ...
Combining Active Learning and Boosting for Naïve Bayes Text Classifiers
This paper presents a variant of the AdaBoost algorithm for boosting the Naïve Bayes text classifier, called AdaBUS, which combines active learning with boosting. Boosting has been shown to effectively improve the accuracy of machine-learning-based classifiers. However, the Naïve Bayes classifier, which is remarkably successful in practice for text classification problems, is known not to...
One-Pass Boosting
This paper studies boosting algorithms that make a single pass over a set of base classifiers. We first analyze a one-pass algorithm in the setting of boosting with diverse base classifiers. Our guarantee is the same as the best proved for any boosting algorithm, but our one-pass algorithm is much faster than previous approaches. We next exhibit a random source of examples for which a “picky” v...